“Filter Bubble”
The ‘Crisis’ of Freedom of Thought
Arun Kumar Gond
Today, people are living in a time when platforms like Google, YouTube, Instagram, and other social media are not just sources of entertainment; they deeply influence people’s thoughts, decisions, and even health choices. Whenever one searches for something, whether it is about child vaccination, home remedies for a headache, or ways to manage mental stress, the information one receives is not based simply on the query. It is shaped by past behaviour, preferences, location, social network, and even the likes of friends.
Back in 2011, the author Eli Pariser, in his well-known book The Filter Bubble: What the Internet Is Hiding from You, explained how search engines and algorithms on the internet gradually lead people into a world where they mostly see and hear things that align with what they already believe. The content that appears is filtered according to previous searches, browsing habits, and social connections, and often ends up reinforcing existing opinions. This creates a “filter bubble” around the searcher. What makes this bubble truly dangerous is that it is invisible: the filtering is so subtle that people rarely realise the information reaching them has been filtered at all. For example, when one searches for something related to a health issue, the search engine decides what to show, and that decision is based on the data one has already supplied. By clicking on the same kind of results again and again, one unknowingly strengthens the bubble even further. The problem is not just that people receive limited information; the process can also steer one toward false or misleading information, especially if the profile already leans toward content that promotes misinformation.
And because these algorithms are highly complex and opaque, the average user neither understands them nor has any real control over them. In fact, some researchers have described this situation as a “gravitational black hole of information”: once someone is caught in the pull of misinformation, getting out becomes as difficult as light escaping from a black hole. Every new piece of information only reinforces existing beliefs, pulling the person deeper into the bubble. Technology alone is not to blame. People’s personal biases, cultural perspectives, and sources of information all play a significant role. But as the internet learns more about people’s habits, it builds those personal biases into its algorithmic systems, making access to information ever more one-sided and imbalanced. Each time one searches for something on the internet, watches a video, or comments on a post, one leaves behind traces of one’s preferences. This information builds a digital profile, known in technical terms as user data. The algorithms working behind the scenes then analyse this data and decide what one will see next.
Apps like TikTok, Instagram Reels, and YouTube Shorts select videos for people based on age, gender, location, and past behaviour. If you have watched a few videos about health or yoga, your feed will soon be filled with only that type of content. You may feel you are watching things by your own choice, but in reality Google is no longer searching for you; it is searching about you. Gradually, you get surrounded by a digital world that shows you only things that align with your existing thoughts while hiding everything else. This is the filter bubble, a quiet and invisible trap that people fall into on their own. Social media has now become a part of daily life. On apps like Instagram Reels, YouTube Shorts, and TikTok, the content people view is what they are shown again and again. For example, if you have watched reels about home remedies, Ayurveda, or beauty tips a few times on Instagram, your feed will soon be filled with similar content. The same happens on YouTube Shorts: if you watch a video about religious miracles or a particular ideology, you will start seeing more content that reflects that same line of thinking.
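How such a feed narrows can be shown with a toy model. The sketch below is a deliberately simplified, hypothetical recommender, not the actual system of any platform named in this article: it assumes a small catalogue of topic-tagged videos, weights each video by how often the user has already watched that topic, and feeds every new view back into the user's profile. All names in it (CATALOGUE, recommend, profile) are invented for illustration.

    # Toy illustration only: a preference-weighted recommender whose feed
    # narrows as every view is fed back into the user's profile.
    from collections import Counter
    import random

    CATALOGUE = (["yoga"] * 20 + ["ayurveda"] * 20 + ["politics"] * 20 +
                 ["science"] * 20 + ["cinema"] * 20)

    def recommend(profile, k=5):
        # Weight each video by how often its topic already appears in the profile.
        weights = [1 + 10 * profile[topic] for topic in CATALOGUE]
        return random.choices(CATALOGUE, weights=weights, k=k)

    profile = Counter(["yoga", "yoga", "ayurveda"])   # a few early clicks
    for day in range(30):                             # a month of viewing
        for video in recommend(profile):
            profile[video] += 1                       # each view reinforces the profile

    print(profile.most_common())
    # Topics the user started with crowd out everything else within days:
    # the bubble closes without the user ever choosing to exclude anything.

The point is not the code but the loop it captures: the output of one recommendation becomes the input of the next, which is exactly how the bubble described above tightens.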
On TikTok, emotional stories, such as someone claiming to have healed from an illness without medication, often go viral, and people start believing them. Gradually, people find themselves surrounded by content that only reinforces their old beliefs, with barely any room left for new or different perspectives. This is what is called the Information Black Hole: it silently pulls viewers in, and they do not even realise that they are now living in a very narrow world. Back in 1961, the researcher James Stoner observed that discussion within a group can push its members’ views further in the direction they already leaned, a finding later generalised as group polarisation: when people stay only among those who think like them, their views tend to become more extreme and one-sided. Think about it: if you keep hearing the same opinions on social media every day, from people who are against vaccines, say, or who believe in a single ideology, eventually those views start to feel even more right to you. You stop listening to different perspectives, or you start thinking they must be wrong.
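The logic of this drift can be seen in a deliberately simple opinion-dynamics sketch. It is not Stoner’s experiment or any published model, only an illustrative assumption: each agent holds an opinion between -1 and +1, listens only to agents on its own side, and moves slightly toward the most extreme voice it hears.

    # Toy opinion-dynamics sketch (illustrative only, not Stoner's study).
    import random

    def polarise(opinions, rounds=50, step=0.05):
        ops = list(opinions)
        for _ in range(rounds):
            for i, o in enumerate(ops):
                like_minded = [p for p in ops if p * o > 0]   # hear one's own side only
                if like_minded:
                    target = max(like_minded, key=abs)        # the most extreme peer heard
                    ops[i] = max(-1.0, min(1.0, o + step * (target - o)))
        return ops

    random.seed(1)
    start = [random.uniform(-0.3, 0.3) for _ in range(20)]    # mild, mixed views
    end = polarise(start)
    print(f"average strength of opinion before: {sum(map(abs, start)) / 20:.2f}")
    print(f"average strength of opinion after:  {sum(map(abs, end)) / 20:.2f}")
    # Hearing only one's own side pulls mild views toward the most extreme
    # view already present on that side; no new argument is ever needed.

Nothing in the sketch adds new information; merely restricting who is heard is enough to harden positions, which is the polarisation this article describes.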
This has an impact on society too. Where conversations once helped people find common ground, arguments and divisions are now growing. People stop listening to each other. And truth is no longer measured by whether something is actually right or wrong, but by whether people in the “group” believe it. This is truth based on social trust, and it can sometimes lead people away from the real truth. To escape the filter bubble, the most important things are awareness and understanding.
In this era of new media, social media platforms are constantly refining their algorithmic recommendation systems, aiming to keep users engaged and spending more time on their platforms. But when this very technology begins to filter information according to a user’s personal preferences, behaviours, and networks, it creates an invisible wall: the filter bubble. This bubble traps people within similar types of information, limits the diversity of thought, and gradually leads to problems like group polarisation. As a result, people not only become cut off from new or opposing views but also start believing that their own opinions are the only truth. Overcoming this challenge requires more than technical fixes; it demands social and educational effort as well: greater algorithmic transparency, stronger media literacy, and the habit of seeking information from a variety of sources.
[The author is deeply grateful to his research supervisor, Dreyer Pathak, for his suggestions.] [Author: Arun Kumar Gond, Research Scholar, Department of Sociology, University of Allahabad, India. His research focuses on the implications of social media in rural society.]
Frontier
Vol 58, No. 13, Sep 21 - 27, 2025